Similar Resources
Multiparameter Hierarchical Clustering Methods
We propose an extension of hierarchical clustering methods, called multiparameter hierarchical clustering methods, which are designed to exhibit sensitivity to density while retaining desirable theoretical properties. The input of the method we propose is a triple (X, d, f), where (X, d) is a finite metric space and f : X → R is a function defined on the data X, which could be a density estimate...
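As a rough illustration of this input format only, the triple (X, d, f) can be encoded as a point set, a pairwise distance matrix, and a per-point function value. The sketch below is hypothetical and not code from the paper; the Gaussian-kernel density used for f and the bandwidth parameter are illustrative assumptions.

```python
import numpy as np

# Hypothetical encoding of the input triple (X, d, f):
# X: finite data set, d: metric as a symmetric distance matrix,
# f: a function on X (here a toy Gaussian-kernel density estimate).
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])    # data points
d = np.linalg.norm(X[:, None] - X[None, :], axis=-1)  # pairwise distances d(x_i, x_j)
bandwidth = 1.0                                       # illustrative kernel bandwidth
f = np.exp(-(d / bandwidth) ** 2).sum(axis=1)         # density estimate f : X -> R
```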
Semi-Stochastic Gradient Descent Methods
In this paper we study the problem of minimizing the average of a large number (n) of smooth convex loss functions. We propose a new method, S2GD (Semi-Stochastic Gradient Descent), which runs for one or several epochs, in each of which a single full gradient and a random number of stochastic gradients are computed, following a geometric law. The total work needed for the method to output an ε-ac...
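The abstract's description of an epoch (one full gradient at a snapshot, then a geometrically distributed number of variance-reduced stochastic steps) can be sketched as below. This is a simplified illustration, not the paper's reference implementation; the names (grad_i, rate, max_inner) and the truncated-geometric sampling of the inner-loop length are assumptions.

```python
import numpy as np

def s2gd(grad_i, n, x0, epochs=10, max_inner=100, step=0.1, rate=0.95, rng=None):
    """Sketch of an S2GD-style run. grad_i(x, i) is assumed to return the
    gradient of the i-th loss term at x; parameter names are illustrative."""
    rng = np.random.default_rng() if rng is None else rng
    y = x0.copy()
    for _ in range(epochs):
        mu = sum(grad_i(y, i) for i in range(n)) / n    # one full gradient pass
        # inner-loop length drawn from a (truncated) geometric law
        t = min(rng.geometric(1 - rate), max_inner)
        x = y.copy()
        for _ in range(t):
            i = rng.integers(n)                         # sample one loss term
            # variance-reduced stochastic gradient estimate
            v = grad_i(x, i) - grad_i(y, i) + mu
            x -= step * v
        y = x                                           # new snapshot point
    return y
```

For a least-squares objective, for instance, grad_i(x, i) would return a_i * (a_i @ x - b_i) for the i-th data row; the variance-reduced estimate v is unbiased for the full gradient at x.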
Iteration Complexity of Feasible Descent Methods for Convex Optimization
In many machine learning problems, such as the dual form of SVM, the objective function to be minimized is convex but not strongly convex. This fact makes it difficult to obtain the complexity of some commonly used optimization algorithms. In this paper, we prove global linear convergence for a wide range of algorithms when they are applied to some non-strongly convex problems. In partic...
Advances in multiparameter optimization methods for de novo drug design.
INTRODUCTION: A high-quality drug must achieve a balance of physicochemical and absorption, distribution, metabolism and elimination properties, safety and potency against its therapeutic target(s). Multiparameter optimization (MPO) methods guide the simultaneous optimization of multiple factors to quickly target compounds with the highest chance of downstream success. MPO can be combined with '...
On Spectral Properties of Steepest Descent Methods
In recent years it has become increasingly clear that the critical issue in gradient methods is the choice of the step length, whereas using the gradient as the search direction may lead to very effective algorithms, whose surprising behaviour has been only partially explained, mostly in terms of the spectrum of the Hessian matrix. On the other hand, the convergence of the classical Cauchy stee...
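To make the connection between step length and the Hessian spectrum concrete, the classical Cauchy steepest descent step on a quadratic is sketched below. This is a standard textbook construction, not code from the paper; the quadratic form f(x) = ½ xᵀAx − bᵀx is assumed for illustration.

```python
import numpy as np

def cauchy_steepest_descent(A, b, x0, iters=100):
    """Classical Cauchy steepest descent on f(x) = 0.5 x^T A x - b^T x,
    with A symmetric positive definite. The exact-line-search step
    alpha = g^T g / g^T A g is the reciprocal of a Rayleigh quotient of A,
    which is why the method's behaviour is governed by the Hessian spectrum."""
    x = x0.astype(float)
    for _ in range(iters):
        g = A @ x - b                 # gradient; search direction is -g
        denom = g @ (A @ g)
        if denom == 0:                # g = 0: already at the minimizer
            break
        alpha = (g @ g) / denom       # exact minimizer along -g (Cauchy step)
        x -= alpha * g
    return x
```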
Journal
Journal title: Linear Algebra and its Applications
Year: 1999
ISSN: 0024-3795
DOI: 10.1016/s0024-3795(99)00112-3